-
- Instructions on testing the HDF utility programs.
-
-
- hdf24to8:
-
- Copy head.r24 in the examples/files directory.
-
- Execute:
- hdf24to8 head.r24 head8.hdf
-
- View head8.hdf.
-
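- If you do not have an image viewer handy, a quick sanity check is to
- list the contents of the new file (a sketch, assuming the hdfls
- utility described later in these instructions is installed):
-
- hdfls -l head8.hdf
-
- The listing should show an 8-bit raster image set rather than the
- original 24-bit data.
-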
- Not yet tested: error traps
-
- ristosds:
-
- Copy the three HDF files storm110.hdf, storm120.hdf, and
- storm130.hdf from the examples/files directory.
-
- Execute:
- ristosds storm*.hdf -o storm.hdf
-
- Use hdfed to compare storm110.hdf, storm120.hdf, and storm130.hdf
- with storm.hdf:
- In storm.hdf, tag 702's element should be 38988 bytes.
- In storm110.hdf, tag 302's element should be one twelfth of
- that, which is 3249 bytes. (It is a 57x57 image; the SDS holds
- the three images as 4-byte values, and 3 x 3249 x 4 = 38988.)
- Compare the first few numbers in storm110's image
- with the first few numbers in storm.hdf's SDS. They
- should be the same.
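-
- One way to check the element lengths is with hdfed itself, using only
- the hdfed commands demonstrated in the hdfed section below (a sketch;
- use 'prev tag = 302' instead when examining storm110.hdf):
-
- hdfed storm.hdf
- info -all
- prev tag = 702
- info -long
- close
- quit
-
- The Length field reported by 'info -long' should be 38988 bytes for
- the SDS in storm.hdf and 3249 bytes for the image in storm110.hdf.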
-
-
- hdfpack:
-
- Copy the file test.cdf from the examples/files directory.
-
- Execute:
- hdfpack test.cdf test.pck
- hdfpack -b test.cdf test.blk
-
- Use hdfls to get a listing of test.cdf and test.pck. The only
- difference between the two listings should be that test.pck
- should not have any special elements (they show up as "Unknown
- Tag") and should not have any "Linked Block Indicators."
- Use the HDF browser to verify that the Vgroup named "Float" has a
- Vdata with ref no. 55 that contains the values:
- 0.0, 1.0, 2.0,..., 359.0
- The file sizes should be as follows:
- test.cdf - 11795
- test.pck - 6747
- test.blk - 8111
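-
- One way to make the listing and file-size checks (a sketch; 'ls -l'
- is the standard Unix command for reporting file sizes):
-
- hdfls test.cdf
- hdfls test.pck
- ls -l test.cdf test.pck test.blk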
-
-
- hdftopal/paltohdf:
-
- Copy the file palette.raw from the examples/files directory.
-
- Execute:
- paltohdf palette.raw palette.hdf
- hdftopal palette.hdf palette.raw.new
-
- Use hdfls with the '-l' option to examine the HDF palette file.
- It should have an 'Image Palette-8' and an 'Image Palette,'
- both with length 768 bytes. They should also have the same
- reference number.
- Use the Unix utility 'cmp' or something similar to do a byte-for-byte
- comparison of palette.raw and palette.raw.new. They should be
- identical.
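-
- For example (a sketch; 'cmp' prints nothing and exits with status 0
- when the two files are identical):
-
- hdfls -l palette.hdf
- cmp palette.raw palette.raw.new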
-
-
- r8tohdf/hdftor8:
-
- Copy the files storm*.raw and palette.raw from the examples/files directory.
-
- Execute:
- r8tohdf 57 57 storm.hdf storm*.raw
- r8tohdf 57 57 storm.hdf -p palette.raw -i storm110.raw
- hdftor8 storm.hdf
-
- Use hdfls with the '-l' option to examine the HDF file. It should
- contain five raster image sets, one of which will be compressed
- under IMCOMP compression. (If you do not put the '-p' in the
- second r8tohdf command above, you should get an error message.)
- The non-compressed raster images should be the same length as
- the raw raster files. The compressed image will be about 25% of
- that size.
- Use the Unix utility 'cmp' or something similar to do byte-for-byte
- comparisons on the raw raster files produced by hdftor8. There
- should be one more than you had at the start. One of them (the
- image stored with lossy IMCOMP compression) may not compare exactly
- with any of the original raw rasters; each of the rest should
- compare with one of the originals. There is no guarantee about the
- order in which the raw rasters are produced, but they will most
- likely come out in the order they went into the file: increasing
- numerical order, with the compressed image last.
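-
- One way to run the comparisons (a sketch in the Bourne shell; the
- 'img*' pattern is only a guess at the names hdftor8 writes, so
- substitute whatever raw raster files it actually produced):
-
- for new in img*
- do
- for old in storm*.raw
- do
- cmp -s $new $old && echo "$new matches $old"
- done
- done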
-
-
- hdfcomp:
-
- Copy the files storm*.hdf from the examples/files directory.
-
- Execute:
- hdfcomp allstorms.hdf storm*.hdf
- hdfcomp allcomp.hdf -c storm*.hdf
-
- Use hdfls with the '-l' option to examine the two HDF files. The first,
- allstorms.hdf, should simply hold the rasters together in one file,
- with no compression. You can use hdfls on the original files for
- comparison. The second file, allcomp.hdf, should hold all the rasters
- in a compressed format. Run-Length Encoding (RLE) compression will
- result in modest savings - about 10% to 15% for these files.
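-
- For example (a sketch; 'ls -l' is included only to compare the total
- file sizes):
-
- hdfls -l allstorms.hdf
- hdfls -l allcomp.hdf
- ls -l allstorms.hdf allcomp.hdf storm*.hdf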
-
- hdfed:
-
- Copy the file storm110.hdf from the examples/files directory.
-
- Execute:
- hdfed storm110.hdf
-
- Running interactively, type the following commands:
-
- info -all
- prev tag = 300
- info -long
- dump -short
-
- The latter two commands should result in the following responses:
-
- (6) Image Dimensions : (Tag 300)
- Ref: 110, Offset: 3459, Length: 20 (bytes)
- 0: 0 57 0 57 106 110
- 12: 1 0 0 0
-
-
- Type help and experiment. Most of the information can be verified
- with hdfls. Be sure to type 'close' then 'quit' when you are finished.
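-
- For example, the tags, reference numbers, and element lengths that
- 'info' reports should agree with the hdfls listing (a sketch):
-
- hdfls -l storm110.hdf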
-
- CAUTION: This utility currently uses the Unix utility 'od' (octal dump)
- to examine the contents of the HDF file; 'od' does not exist in PC and
- Mac environments. Also, because of the internal storage methods of HDF,
- little-endian machines will give garbled results. Finally, on some
- machines, such as the Cray and Convex, the dump will differ from the
- one shown above because the word size is different.
-
-
-